

KIISE Transactions on Computing Practices (정보과학회 컴퓨팅의 실제 논문지)


Korean Title: 다중 이종 NPU 장치에서 다중 뉴럴넷을 추론하기 위한 NPU 운영체제 플랫폼 개발
English Title: Development of NPU Operating System Platform for Inference of Multiple Neural Networks on Multiple Heterogeneous NPU Devices
Authors: 김상철 (Sang Cheol Kim), 김용연 (Yong Yeon Kim), 김태호 (Taeho Kim)
Citation: Vol. 26, No. 12, pp. 561-566 (Dec. 2020)
Abstract:
Recently, special-purpose devices called NPUs have come into widespread use for fast image inference with AI techniques. Current practice, however, mostly remains at the level of running a single neural network on a single NPU; when executing multiple applications that issue multiple inference requests, far more complex issues must be addressed. In particular, since each system manufacturer provides its own compilation method and execution environment for its NPU devices, the inference mechanism must account for the fact that several types of NPUs may coexist. In this paper, we propose an NPU operating system platform that performs inference when multiple NPUs of various types are present and multiple neural network models are provided. The proposed approach enables easy, fast, and efficient execution and control of multiple neural network inference applications on various types of NPU devices.
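The abstract gives no implementation details, so the following is only a minimal Python sketch of the core idea it describes: a uniform device interface that hides each vendor's compile/run differences, plus a scheduler that spreads inference requests from multiple models across heterogeneous NPUs. Every class and method name here (`NPUDevice`, `NPUScheduler`, `submit`, the vendor backends) is hypothetical, not the authors' actual API.

```python
from abc import ABC, abstractmethod

class NPUDevice(ABC):
    """Uniform interface hiding each vendor's compile/run details (hypothetical)."""
    @abstractmethod
    def compile(self, model: str) -> str: ...
    @abstractmethod
    def infer(self, binary: str, inp: str) -> str: ...

class VendorANPU(NPUDevice):
    # Stand-in for one vendor's toolchain; a real driver would call the vendor SDK.
    def compile(self, model: str) -> str:
        return f"A-bin({model})"
    def infer(self, binary: str, inp: str) -> str:
        return f"A:{binary}:{inp}"

class VendorBNPU(NPUDevice):
    # A second vendor with a different compiler and execution environment.
    def compile(self, model: str) -> str:
        return f"B-bin({model})"
    def infer(self, binary: str, inp: str) -> str:
        return f"B:{binary}:{inp}"

class NPUScheduler:
    """Dispatches inference requests round-robin across heterogeneous NPUs,
    compiling each model once per device and caching the result."""
    def __init__(self, devices: list[NPUDevice]):
        self.devices = devices
        self.cache: dict[tuple[int, str], str] = {}  # (device idx, model) -> binary
        self.next_dev = 0

    def submit(self, model: str, inp: str) -> str:
        i = self.next_dev
        self.next_dev = (self.next_dev + 1) % len(self.devices)
        key = (i, model)
        if key not in self.cache:  # per-device compilation, done once per model
            self.cache[key] = self.devices[i].compile(model)
        return self.devices[i].infer(self.cache[key], inp)
```

With `sched = NPUScheduler([VendorANPU(), VendorBNPU()])`, successive `sched.submit("resnet50", ...)` calls alternate between the two devices, and each device compiles the model with its own toolchain only on first use. The paper's actual platform presumably handles far more (queueing, priorities, device load), but the interface-plus-scheduler split is the structural point.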
Keywords: neural network, neural processing device, artificial intelligence operating system, artificial intelligence software platform